Multi-dimensional balanced graph partitioning via projected gradient descent

Authors
Abstract

Similar articles

Spectral Compressed Sensing via Projected Gradient Descent

Let x ∈ C^n be a spectrally sparse signal consisting of r complex sinusoids with or without damping. We consider the spectral compressed sensing problem, which is about reconstructing x from its partially revealed entries. By utilizing the low-rank structure of the Hankel matrix corresponding to x, we develop a computationally efficient algorithm for this problem. The algorithm starts from an initi...
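
The snippet above describes a projected gradient descent (PGD) scheme that alternates between fitting the observed entries and projecting onto a low-rank Hankel structure. Below is a minimal, illustrative sketch of one such iteration, not the authors' implementation: the Hankel shape, the sampling set `omega`, the rank `r`, and the step size `eta` are assumed placeholders.

```python
# Illustrative sketch of one PGD iteration for spectral compressed sensing:
# gradient step on the observed entries, lift to a Hankel matrix, truncate to
# rank r, then average anti-diagonals back to a signal. Names are placeholders.
import numpy as np

def hankel(x, p):
    """Hankel matrix of size p x (len(x) - p + 1) built from signal x."""
    n = len(x)
    return np.array([[x[i + j] for j in range(n - p + 1)] for i in range(p)])

def dehankel(H):
    """Average the anti-diagonals of H back into a length-(p + q - 1) signal."""
    p, q = H.shape
    x = np.zeros(p + q - 1, dtype=H.dtype)
    counts = np.zeros(p + q - 1)
    for i in range(p):
        for j in range(q):
            x[i + j] += H[i, j]
            counts[i + j] += 1
    return x / counts

def pgd_step(x, y, omega, r, eta=1.0):
    """One iteration: gradient step on observed entries, then rank-r Hankel projection."""
    grad = np.zeros_like(x)
    grad[omega] = x[omega] - y[omega]          # gradient of 0.5 * ||P_omega(x - y)||^2
    z = x - eta * grad                         # gradient step
    H = hankel(z, len(z) // 2 + 1)             # lift to a (roughly square) Hankel matrix
    U, s, Vh = np.linalg.svd(H, full_matrices=False)
    Hr = (U[:, :r] * s[:r]) @ Vh[:r]           # best rank-r approximation
    return dehankel(Hr)                        # project back to a signal
```

A caller would repeat `pgd_step` from a suitable initialization until the iterates stop changing; the choice of initialization is exactly where the truncated abstract leaves off.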

Material for “Spectral Compressed Sensing via Projected Gradient Descent”

We extend PGD and its recovery guarantee [1] from one-dimensional spectrally sparse signal recovery to the multi-dimensional case. Assume the underlying multi-dimensional spectrally sparse signal is of model order r and total dimension N. We show that O(r log(N)) measurements are sufficient for PGD to achieve successful recovery with high probability provided the underlying signal satisfies so...

A Polynomial Algorithm for Balanced Clustering via Graph Partitioning

The objective of clustering is to discover natural groups in datasets and to identify geometrical structures which might reside there, without assuming any prior knowledge on the characteristics of the data. The problem can be seen as detecting the inherent separations between groups of a given point set in a metric space governed by a similarity function. The pairwise similarities between all ...

Doubly Balanced Connected Graph Partitioning

We introduce and study the Doubly Balanced Connected graph Partitioning (DBCP) problem: Let G=(V,E) be a connected graph with a weight (supply/demand) function p: V → {−1, +1} satisfying p(V) = ∑_{j∈V} p(j) = 0. The objective is to partition G into (V1, V2) such that G[V1] and G[V2] are connected, |p(V1)|, |p(V2)| ≤ c_p, and max{|V1|/|V2|, |V2|/|V1|} ≤ c_s, for some constants c_p and c_s. When G is 2-connec...
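
As a concrete reading of the constraints in the DBCP definition above, the sketch below checks whether a candidate partition (V1, V2) satisfies them; the use of networkx and the helper's name are illustrative assumptions, not part of the paper.

```python
# Checks the DBCP constraints for a candidate partition: both induced subgraphs
# connected, supply/demand imbalance |p(Vi)| at most c_p, size ratio at most c_s.
import networkx as nx

def is_dbcp_partition(G, p, V1, V2, c_p, c_s):
    """Return True if (V1, V2) is a valid DBCP partition of G under weights p."""
    V1, V2 = set(V1), set(V2)
    if not V1 or not V2:
        return False                                   # both sides must be non-empty
    if V1 | V2 != set(G.nodes) or V1 & V2:
        return False                                   # must partition the vertex set
    if not (nx.is_connected(G.subgraph(V1)) and nx.is_connected(G.subgraph(V2))):
        return False                                   # G[V1] and G[V2] connected
    if abs(sum(p[v] for v in V1)) > c_p or abs(sum(p[v] for v in V2)) > c_p:
        return False                                   # |p(V1)|, |p(V2)| <= c_p
    ratio = max(len(V1) / len(V2), len(V2) / len(V1))
    return ratio <= c_s                                # size ratio <= c_s
```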

Learning ReLUs via Gradient Descent

In this paper we study the problem of learning Rectified Linear Units (ReLUs), which are functions of the form x ↦ max(0, ⟨w, x⟩) with w ∈ R^d denoting the weight vector. We study this problem in the high-dimensional regime where the number of observations is smaller than the dimension of the weight vector. We assume that the weight vector belongs to some closed set (convex or nonconvex) which captu...
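
The snippet above sets up gradient descent for a single ReLU under a constraint set on w. The sketch below is a hedged illustration of that generic recipe, not the paper's algorithm: the squared loss, the l2-ball constraint, the random initialization, and the step size are all assumed choices.

```python
# Illustrative projected gradient descent for fitting y_i ≈ max(0, <w, x_i>).
# Data X (m x d), labels y (m,); the constraint set here is an l2-ball.
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def project_l2(radius):
    """Exact Euclidean projection onto the l2-ball of the given radius."""
    def proj(w):
        norm = np.linalg.norm(w)
        return w if norm <= radius else w * (radius / norm)
    return proj

def fit_relu_pgd(X, y, project, iters=200, eta=0.1):
    """Minimize (1/2m) * sum (relu(X w) - y)^2 by projected gradient descent."""
    m, d = X.shape
    rng = np.random.default_rng(0)
    w = project(rng.standard_normal(d) / np.sqrt(d))   # nonzero start (w = 0 stalls the ReLU gradient)
    for _ in range(iters):
        z = X @ w
        residual = relu(z) - y
        grad = X.T @ (residual * (z > 0)) / m          # ReLU passes gradient only where z > 0
        w = project(w - eta * grad)                    # gradient step, then projection
    return w
```

Swapping `project_l2` for the projection onto whichever closed set encodes the known side-information recovers the general constrained setting the abstract refers to.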

Journal

Journal title: Proceedings of the VLDB Endowment

Year: 2019

ISSN: 2150-8097

DOI: 10.14778/3324301.3324307